Duality of Maximum Entropy and Minimum Divergence

Authors

  • Shinto Eguchi
  • Osamu Komori
  • Atsumi Ohara
Abstract

We discuss a special class of generalized divergence measures defined by generator functions. Any divergence measure in the class separates into the difference between a cross entropy and a diagonal entropy. The diagonal entropy in the class is associated with a model of maximum entropy distributions; the divergence measure leads to statistical estimation via minimization for an arbitrarily given statistical model. The dualistic relationship between the maximum entropy model and minimum divergence estimation is explored in the framework of information geometry. The model of maximum entropy distributions is characterized as totally geodesic with respect to the linear connection associated with the divergence. This yields a natural extension of the classical theory of the maximum likelihood method under the maximum entropy model in terms of the Boltzmann-Gibbs-Shannon entropy. We discuss the duality in detail for the Tsallis entropy as a typical example.
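The Tsallis case highlighted in the abstract can be illustrated numerically. The sketch below assumes the standard definitions of Tsallis entropy and Tsallis relative entropy (not the paper's own generator-function notation); both recover their Shannon and Kullback-Leibler counterparts in the limit q → 1.

```python
import math

def tsallis_entropy(p, q):
    """Tsallis entropy S_q(p) = (1 - sum_i p_i^q) / (q - 1).

    Reduces to the Boltzmann-Gibbs-Shannon entropy as q -> 1.
    """
    if abs(q - 1.0) < 1e-12:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return (1.0 - sum(pi ** q for pi in p)) / (q - 1.0)

def tsallis_divergence(p, r, q):
    """Tsallis relative entropy D_q(p || r) = (sum_i p_i^q r_i^(1-q) - 1) / (q - 1).

    Reduces to the Kullback-Leibler divergence as q -> 1; zero iff p == r.
    """
    if abs(q - 1.0) < 1e-12:
        return sum(pi * math.log(pi / ri) for pi, ri in zip(p, r) if pi > 0)
    return (sum(pi ** q * ri ** (1.0 - q) for pi, ri in zip(p, r)) - 1.0) / (q - 1.0)
```

For q = 2 and the uniform distribution on two points, S_2 = 0.5, and the divergence from any distinct distribution is strictly positive, consistent with its role as a minimization objective for estimation.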


Related articles

Unifying Divergence Minimization and Statistical Inference Via Convex Duality

In this paper we unify divergence minimization and statistical inference by means of convex duality. In the process of doing so, we prove that the dual of approximate maximum entropy estimation is maximum a posteriori estimation. Moreover, our treatment leads to stability and convergence bounds for many statistical learning problems. Finally, we show how an algorithm by Zhang can be used to sol...


A Maximum Entropy/minimum Divergence Translation Model

I present empirical comparisons between a standard statistical translation model and an equivalent Maximum Entropy/Minimum Divergence (MEMD) model, using several different methods for automatic feature selection. Results show that the MEMD model significantly outperforms the standard model in test corpus perplexity, even though it has far fewer parameters.


A Maximum Entropy/Minimum Divergence Translation Model

I present empirical comparisons between a linear combination of standard statistical language and translation models and an equivalent Maximum Entropy/Minimum Divergence (MEMD) model, using several different methods for automatic feature selection. The MEMD model significantly outperforms the standard model in test corpus perplexity, even though it has far fewer parameters.


Incorporating Position Information into a Maximum Entropy/Minimum Divergence Translation Model

I describe two methods for incorporating information about the relative positions of bilingual word pairs into a Maximum Entropy/Minimum Divergence translation model. The better of the two achieves over 40% lower test corpus perplexity than an equivalent combination of a trigram language model and the classical IBM translation model 2.


Extropy: Complementary Dual of Entropy

This article provides a completion to theories of information based on entropy, resolving a longstanding question in its axiomatization as proposed by Shannon and pursued by Jaynes. We show that Shannon’s entropy function has a complementary dual function which we call “extropy.” The entropy and the extropy of a binary distribution are identical. However, the measure bifurcates into a pair of d...
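The claim that entropy and extropy coincide for a binary distribution is easy to check numerically. A minimal sketch, assuming the definition of extropy as J(p) = -Σ (1 - p_i) log(1 - p_i) from the Lad, Sanfilippo, and Agró line of work that this abstract belongs to:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i)."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

def extropy(p):
    """Extropy J(p) = -sum_i (1 - p_i) * log(1 - p_i), the complementary dual of H."""
    return -sum((1.0 - pi) * math.log(1.0 - pi) for pi in p if pi < 1)
```

For a binary distribution (p, 1 - p) the two sums swap their terms, so H and J are identical; with three or more outcomes the measures bifurcate and generally differ.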



Journal:
  • Entropy

Volume 16, Issue 

Pages  -

Publication date 2014